Data Compression at Low Power Using Soft Competitive Learning
Authors
Abstract
This paper examines a variety of issues relating to the analog hardware implementation of the soft competitive neural learning algorithm and its suitability for data compression applications. Specifically, we investigate the impact of realizing the theoretical learning algorithm in imperfect analog structures fabricated in a conventional CMOS process. Empirical measurements of previously fabricated neural circuit elements are used to produce hardware models that accurately characterize the fabrication process and a typical operating environment. These models are used to evaluate the tolerance of the soft competitive learning algorithm to expected system variations, including various noise and device effects. The analog neural circuits make extensive use of a CMOS implementation of the Gilbert multiplier, which serves as the primary computational element for the learning computations. Simulations based on these hardware models show that the circuit effects are not significant when zero-thresholding is used to compensate for multiplier zero-crossing offsets. These results indicate that the algorithm is very robust in the presence of moderate circuit limitations. As a result, such circuits would be well suited to applications requiring data compression at low power, as encountered in compact consumer products for portable computing.
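To make the robustness claim concrete, the sketch below pairs a soft competitive learning update with a simple behavioral model of a nonideal analog multiplier (a fixed zero-crossing offset plus additive noise) and applies zero-thresholding to the computed update. This is a minimal Python illustration under assumed parameter values; the offset/noise model and all names are illustrative, not the authors' measured circuit models.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_multiply(a, b, offset, noise_std=0.01):
    # Behavioral stand-in for an analog multiplier: ideal product plus a
    # fixed zero-crossing offset and additive noise (assumed model).
    prod = a * b
    return prod + offset + noise_std * rng.standard_normal(prod.shape)

def zero_threshold(x, eps):
    # Suppress small update signals so offsets near zero do not
    # accumulate into the stored weights.
    return np.where(np.abs(x) < eps, 0.0, x)

def soft_competitive_step(x, W, offsets, lr=0.05, beta=10.0, eps=0.02):
    # One soft competitive update: every codeword moves toward the input
    # in proportion to its soft (softmax-like) assignment probability.
    d2 = np.sum((x - W) ** 2, axis=1)            # squared distances to codewords
    p = np.exp(-beta * (d2 - d2.min()))
    p /= p.sum()                                  # soft assignment probabilities
    delta = noisy_multiply(p[:, None], x - W, offsets)
    W += lr * zero_threshold(delta, eps)          # thresholded, nonideal update
    return W

# Toy usage: adapt an 8-entry codebook to 2-D data despite device mismatch.
data = rng.standard_normal((1000, 2))
W = 0.1 * rng.standard_normal((8, 2))
offsets = 0.01 * rng.standard_normal(W.shape)     # fixed per-multiplier mismatch
for x in data:
    W = soft_competitive_step(x, W, offsets)
```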
Similar Resources
Competitive learning algorithms for robust vector quantization
The efficient representation and encoding of signals with limited resources, e.g., finite storage capacity and restricted transmission bandwidth, is a fundamental problem in technical as well as biological information processing systems. Typically, under realistic circumstances, the encoding and communication of messages has to deal with different sources of noise and disturbances. In this paper, ...
Image compression using frequency sensitive competitive neural network
Vector Quantization is one of the most powerful techniques used for speech and image compression at medium to low bit rates. The Frequency-Sensitive Competitive Learning (FSCL) algorithm is particularly effective for adaptive vector quantization in image compression systems. This paper presents a compression scheme for grayscale still images using this FSCL method. In this paper, we have genera...
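As a rough illustration of the frequency-sensitive idea described above, the sketch below trains a codebook in which each codeword's distortion is scaled by its win count, so under-used codewords remain competitive and dead units are avoided. Block extraction, bit allocation, and entropy coding are omitted; function and parameter names are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def fscl_train(blocks, k=256, lr=0.05, epochs=5, seed=0):
    # Frequency-Sensitive Competitive Learning: the winner minimizes
    # count * distance, so frequently winning codewords are penalized.
    rng = np.random.default_rng(seed)
    codebook = blocks[rng.choice(len(blocks), k, replace=False)].astype(float)
    counts = np.ones(k)
    for _ in range(epochs):
        for x in blocks[rng.permutation(len(blocks))]:
            d2 = np.sum((x - codebook) ** 2, axis=1)
            winner = np.argmin(counts * d2)      # frequency-sensitive distortion
            counts[winner] += 1
            codebook[winner] += lr * (x - codebook[winner])
    return codebook

def encode(blocks, codebook):
    # Map each image block to the index of its nearest codeword.
    d2 = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)
```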
Systematic Structuring of the Business Domain of Local Mobile Apps Stores Using Soft Systems Methodology (SSM)
Due to the globally competitive environment in the mobile app market, traditional problem-solving methods for examining the acceptance of stores offering these digital products have ignored the important role of human factors. This weakness necessitates research on relevant policies by governing bodies from another perspective, based on a soft systems thinking approach. This pro...
A Frequency-Sensitive Competitive Learning Networks with Hadamard Transform Applied to Color Image Compression
The neural network is useful for data compression if the connection weights are chosen properly. In this paper, a Modified Frequency-Sensitive Competitive Learning (MFSCL) network with the Hadamard transform, based on Vector Quantization (VQ), is presented for color image compression. The goal is to apply a spread-unsupervised scheme based on the modified competitive learning networks so that on-line le...
Deblocking Joint Photographic Experts Group Compressed Images via Self-learning Sparse Representation
JPEG is one of the most widely used image compression methods, but it causes annoying blocking artifacts at low bit rates. Sparse representation is an efficient technique that can solve many inverse problems in image processing applications such as denoising and deblocking. In this paper, a post-processing method is proposed for reducing JPEG blocking effects via sparse representation. In this ...
Journal:
Volume, Issue:
Pages: -
Publication year: 1997